---
title: Deployment settings
description: Add data to a deployment and configure monitoring, notifications, and challenger behavior using the Settings tab.
---

# Deployment settings {: #settings-tab }

!!! info "Deprecation notice"
    The **Settings > Data** and **Settings > Monitoring** tabs are deprecated and scheduled for removal. The new deployment settings workflow separates deployment configuration and monitoring setup into dedicated settings pages, providing a more organized and intuitive interface. During the deprecation period, you can continue to use the **Data** tab; however, the **Monitoring** tab directs you to the [service health settings](service-health-settings).

You can add data to a deployment and configure monitoring, notifications, and challenger behavior using the settings pages associated with each deployment:

Topic | Description
------|------------
[Set up service health monitoring](service-health-settings) | Enable [segmented analysis](deploy-segment) to assess service health, data drift, and accuracy statistics by filtering them into unique segment attributes and values. 
[Set up data drift monitoring](data-drift-settings)         | Enable [data drift monitoring](data-drift) on a deployment's Data Drift Settings tab. 
[Set up accuracy monitoring](accuracy-settings)             | Enable [accuracy monitoring](deploy-accuracy) on a deployment's Accuracy Settings tab.
[Set up fairness monitoring](fairness-settings)             | Enable [fairness monitoring](mlops-fairness) on a deployment's Fairness Settings tab. 
[Set up humility rules](humility-settings)                  | Enable [humility monitoring](humble) by creating rules that allow models to recognize, in real time, when they make uncertain predictions or receive data they have not seen before.
[Configure retraining](retraining-settings)                 | Enable [Automated Retraining](set-up-auto-retraining) for a deployment by defining the general retraining settings and then creating retraining policies.
[Configure challengers](challengers-settings)               | Enable [challenger comparison](challengers) by configuring a deployment to store prediction request data at the row level and replay predictions on a schedule.
[Review predictions settings](predictions-settings)         | Review the Predictions Settings tab to view details about your deployment's inference data.
[Enable data export](data-export-settings)                  | Enable [data export](data-export) to compute and monitor custom business or performance metrics.
[Set up custom metrics monitoring](custom-metrics-settings) | Enable [custom metrics](custom-metrics) monitoring by defining the "at risk" and "failing" thresholds for the custom metrics you created.
[Set prediction intervals for time series deployments](predictions-settings#set-prediction-intervals-for-time-series-deployments) | Enable [prediction intervals](ts-predictions#prediction-preview) in the prediction response for deployed time series models.
